Fast Neural Net Simulation with

Author

  • Anton Gunzinger
Abstract

This paper describes the implementation of a fast neural net simulator on a novel parallel distributed-memory computer. A 60-processor system, named MUSIC, is operational and runs the back-propagation algorithm at a speed of 247 million connection updates per second (continuous weight update) using 32-bit floating-point precision. This is equal to 1 Gflops sustained performance. The complete system with 3.6 Gflops peak performance consumes less than 800 watts of electrical power and fits into a 19-inch rack. While reaching the speed of modern supercomputers, MUSIC can still be used as a personal desktop computer at a researcher's own disposal. In neural net simulation, this gives a single user a computing performance that was unthinkable before. The system's real-time interfaces make it especially useful for embedded applications.
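
For orientation, here is a minimal sketch (in Python/NumPy, not the MUSIC code) of back-propagation with continuous weight update, i.e. the weights change after every presented pattern rather than once per batch, which is the operation the benchmark counts. The layer sizes, learning rate and random data are arbitrary placeholders, and the connection-updates-per-second figure is computed as (number of weights) x (patterns processed) / time, the usual convention for such benchmarks. The abstract's own numbers imply roughly four floating-point operations per connection update (247 x 10^6 x 4 ≈ 10^9 flop/s, i.e. about 1 Gflops sustained).

    # Minimal single-hidden-layer back-propagation with continuous (per-pattern)
    # weight update in 32-bit floating point. Illustrative only; layer sizes,
    # learning rate and data are arbitrary placeholders, not the MUSIC setup.
    import time
    import numpy as np

    rng = np.random.default_rng(0)
    n_in, n_hid, n_out, lr = 256, 128, 10, np.float32(0.01)

    W1 = rng.standard_normal((n_in, n_hid)).astype(np.float32) * np.float32(0.1)
    W2 = rng.standard_normal((n_hid, n_out)).astype(np.float32) * np.float32(0.1)

    def sigmoid(z):
        return np.float32(1.0) / (np.float32(1.0) + np.exp(-z))

    patterns = rng.standard_normal((1000, n_in)).astype(np.float32)
    targets = rng.random((1000, n_out)).astype(np.float32)

    t0 = time.perf_counter()
    for x, t in zip(patterns, targets):
        h = sigmoid(x @ W1)              # forward pass
        y = sigmoid(h @ W2)
        d2 = (y - t) * y * (1 - y)       # output-layer error
        d1 = (d2 @ W2.T) * h * (1 - h)   # back-propagated hidden error
        W2 -= lr * np.outer(h, d2)       # continuous weight update:
        W1 -= lr * np.outer(x, d1)       # weights change after every pattern
    elapsed = time.perf_counter() - t0

    connections = W1.size + W2.size
    updates = connections * len(patterns)
    print(f"{updates / elapsed / 1e6:.1f} million connection updates per second")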


Similar Articles

Gated Fast Weights for Associative Retrieval

We improve previous end-to-end differentiable neural networks (NNs) with fast weight memories. A gate mechanism updates fast weights at every time step of a sequence through two separate outer-product-based matrices generated by slow parts of the net. The system is trained on a complex sequence-to-sequence variation of the Associative Retrieval Problem with roughly 70 times more temporal memory...
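
As a rough illustration only (not the paper's architecture, whose details are cut off above), a gated fast-weight memory can be maintained by forming a gate matrix and an update matrix as outer products of vectors emitted by a slow network, decaying the fast weights with the gate and adding the update at every time step. All dimensions, nonlinearities and parameter names below are arbitrary assumptions.

    # Hedged sketch of a gated fast-weight update with outer-product matrices.
    # Placeholder "slow" parameters stand in for slow parts of a trained net.
    import numpy as np

    rng = np.random.default_rng(0)
    d_slow, d_fast, T = 32, 16, 5

    Wg_a = rng.standard_normal((d_fast, d_slow))   # gate generators
    Wg_b = rng.standard_normal((d_fast, d_slow))
    Wu_a = rng.standard_normal((d_fast, d_slow))   # update generators
    Wu_b = rng.standard_normal((d_fast, d_slow))

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    F = np.zeros((d_fast, d_fast))                 # fast-weight memory
    for t in range(T):
        s = rng.standard_normal(d_slow)            # stand-in for the slow net's state
        gate = sigmoid(np.outer(Wg_a @ s, Wg_b @ s))    # outer-product gate matrix
        update = np.tanh(np.outer(Wu_a @ s, Wu_b @ s))  # outer-product update matrix
        F = gate * F + update                      # gated fast-weight update per step
        query = np.tanh(Wu_a @ s)                  # illustrative query vector
        retrieved = np.tanh(F @ query)             # associative retrieval via fast weights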


Context in temporal sequence processing: a self-organizing approach and its application to robotics

A self-organizing neural net for learning and recall of complex temporal sequences is developed and applied to robot trajectory planning. We consider trajectories with both repeated and shared states. Both cases give rise to ambiguities during reproduction of stored trajectories which are resolved via temporal context information. Feedforward weights encode spatial features of the input traject...


The ANNIGMA-wrapper approach to fast feature selection for neural nets

This paper presents a novel feature selection approach for backpropagation neural networks (NNs). Previously, a feature selection technique known as the wrapper model was shown to be effective for decision tree induction. However, it is prohibitively expensive when applied to real-world neural net training characterized by large volumes of data and many feature choices. Our approach incorporates a w...
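
Since the snippet is truncated before it describes the ANNIGMA heuristic itself, the sketch below only illustrates the plain wrapper model it builds on: greedy forward selection in which every candidate feature subset is scored by fully retraining the learner, which is exactly what becomes prohibitively expensive for neural nets. The function name and the train_and_score callback are hypothetical placeholders.

    # Generic wrapper-model feature selection (not the ANNIGMA heuristic):
    # greedy forward selection where each candidate subset costs one full
    # training run of the learner via the user-supplied train_and_score().
    from typing import Callable, Sequence, Set

    def wrapper_forward_selection(
        features: Sequence[int],
        train_and_score: Callable[[Set[int]], float],  # returns validation accuracy
        max_features: int,
    ) -> Set[int]:
        selected: Set[int] = set()
        best_score = float("-inf")
        while len(selected) < max_features:
            candidate, candidate_score = None, best_score
            for f in features:
                if f in selected:
                    continue
                score = train_and_score(selected | {f})  # one full training run
                if score > candidate_score:
                    candidate, candidate_score = f, score
            if candidate is None:        # no remaining feature improves the score
                break
            selected.add(candidate)
            best_score = candidate_score
        return selected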


Real-time Scheduling of a Flexible Manufacturing System using a Two-phase Machine Learning Algorithm

The static and analytic scheduling approach is difficult to follow and is not always applicable in real time. Most scheduling algorithms are designed for offline environments. However, real cases present three challenging characteristics: first, the problem data of jobs are not known in advance; second, most of the shop's parameters tend to be stochastic; third, th...


Fast computation of a gated dipole field

We address the need to develop efficient algorithms for the numerical simulation of models based in part or entirely on adaptive resonance theory. We introduce modifications that speed up the computation of the gated dipole field (GDF) in the Exact ART neural network. The speed increase of our solution amounts to at least an order of magnitude for fields with more than 100 gated dipoles. We adopt ...



Journal:

Volume   Issue 

Pages  -

Publication year: 1993